Comparison of Ensemble Approaches: Mixture of Experts and AdaBoost for a Regression Problem
Authors
Abstract
Two machine learning approaches, mixture of experts and AdaBoost.R2, were adapted to a real-world regression problem: predicting the prices of residential premises from historical data on sales/purchase transactions. Computationally intensive experiments were conducted to compare empirically the prediction accuracy of the ensemble models generated by the two methods. The results were analysed with a statistical methodology comprising nonparametric tests followed by post-hoc procedures designed specifically for multiple n×n comparisons. No statistically significant differences were observed among the four best ensembles: two generated by mixture of experts and two by AdaBoost.R2, employing multilayer perceptrons and general linear models as base learning algorithms.
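As a minimal sketch of the AdaBoost.R2 side of such a comparison, the snippet below fits scikit-learn's `AdaBoostRegressor` (which implements Drucker's AdaBoost.R2 reweighting scheme) on synthetic data standing in for the paper's transaction records, which are not available; the feature construction and score threshold are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the housing data: price as a noisy
# function of two normalised features (e.g. area, age).
rng = np.random.RandomState(0)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = 100.0 * X[:, 0] - 20.0 * X[:, 1] + rng.normal(0.0, 2.0, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# AdaBoost.R2: after each round, training cases are reweighted by
# their relative prediction error, so later learners focus on the
# hard cases; default base learners are shallow regression trees.
model = AdaBoostRegressor(n_estimators=50, loss="linear", random_state=0)
model.fit(X_train, y_train)

r2 = model.score(X_test, y_test)
print(f"test R^2: {r2:.3f}")
```

In the paper's setting, the base learner would instead be a multilayer perceptron or a general linear model, and several such ensembles would be compared with nonparametric tests (e.g. the Friedman test) plus post-hoc procedures rather than a single R² score.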
Similar resources
ADABOOST ENSEMBLE ALGORITHMS FOR BREAST CANCER CLASSIFICATION
With advances in technology, many different tumor features have been collected for Breast Cancer (BC) diagnosis; processing such large data sets poses challenges, including high storage requirements and the time needed for access and processing. The objective of this paper is to classify BC based on the extracted tumor features. To extract useful information and diagnose the tumo...
Solving Regression Problems Using Competitive Ensemble Models
The use of ensemble models in many problem domains has increased significantly in the last few years. Ensemble modeling, in particular boosting, has shown great promise in improving the predictive performance of a model. Combining the ensemble members is normally done in a cooperative fashion, where each of the ensemble members performs the same task and their predictions are aggregated to ...
Improving Classification Performance by Aggregating the Effective Features of Different Neural Network Combining Methods
Both theoretical and experimental studies have shown that combining accurate Neural Networks (NNs) in an ensemble with negative error correlation greatly improves their generalization abilities. Negative Correlation Learning (NCL) and Mixture of Experts (ME), two popular combining methods, each employ a different special error function for the simultaneous training of NN experts to produce negat...
Mixture of Experts for Persian handwritten word recognition
This paper presents the results of Persian handwritten word recognition based on the Mixture of Experts (ME) technique. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, we used a Mixture of Experts of Multi-Layered Perceptrons with a momentum term; in the classification ...
Parallel Online Continuous Arcing with a Mixture of Neural Networks
This paper presents a new arcing (boosting) algorithm called POCA, Parallel Online Continuous Arcing. Unlike traditional arcing algorithms (such as AdaBoost), which construct an ensemble by adding and training weak learners sequentially on a round-by-round basis, training in POCA is performed over the entire ensemble continuously and in parallel. Since members of the ensemble are not frozen after...
Journal title:
Volume Issue
Pages -
Publication date 2014